LETTERS TRANSPOSED INTO NUMBER REARRANGED IN NUMERICAL ORDER
SCHRODINGERS CAT
AUTUMN ATUM AUTUMN
AUTUMN ATUM AUTUMN QUANTUM ATUM QUANTUM
RA ATUM ATUM RA
QUANTUM THOUGHTS GODS THOUGHTS QUANTUM
QUANTUM ATUM AMUN ATEN ATOM QUANTUM
Quantum - Wikipedia, the free encyclopedia
Quantum

In physics, a quantum (plural: quanta) is an indivisible entity of a quantity that has the same units as the Planck constant and is related to both the energy and the momentum of elementary particles of matter (called fermions) and of photons and other bosons. The word comes from the Latin "quantus", for "how much." Behind this one finds the fundamental notion that a physical property may be "quantized", a phenomenon referred to as "quantization": the magnitude can take on only certain discrete numerical values, rather than any value, at least within a range. There is a related term, quantum number. A photon is often referred to as a "light quantum". The energy of an electron bound to an atom (at rest) is said to be quantized, which results in the stability of atoms, and of matter in general. But these terms can be a little misleading, because what is quantized is the Planck's-constant quantity, whose units can be viewed as either energy multiplied by time or momentum multiplied by distance. Quantum "mechanics", as it is usually called, is regarded by virtually every professional physicist as the most fundamental framework we have for understanding and describing nature at the infinitesimal level, for the very practical reason that it works. It is "in the nature of things", not a more or less arbitrary human preference.

Development of quantum theory

Planck was reluctant to accept the new idea of quantization, as were many others. But, with no acceptable alternative, he continued to work with the idea, and found his efforts well received. Eighteen years later, when he accepted the Nobel Prize in Physics for his contributions, he called it "a few weeks of the most strenuous work" of his life. During those few weeks, he even had to discard much of his own theoretical work from the preceding years. Quantization turned out to be the only way to describe the new and detailed experiments that were just then being performed.
He did this practically overnight, openly reporting his change of mind to his scientific colleagues in the October, November, and December meetings of the German Physical Society in Berlin, where the black-body work was being intensely discussed. In this way, careful experimentalists (including Friedrich Paschen, O.R. Lummer, Ernst Pringsheim, Heinrich Rubens, and F. Kurlbaum) and a reluctant theorist ushered in a momentous scientific revolution.

The quantum black-body radiation formula

The quantum black-body radiation formula, the very first piece of quantum mechanics, appeared on Sunday evening, October 7, 1900, in a so-called back-of-the-envelope calculation by Planck. It was based on a report by Rubens (visiting with his wife) on the very latest experimental findings in the infrared. Later that evening, Planck sent the formula on a postcard, which Rubens received the following morning. A couple of days later, he informed Planck that it worked perfectly. At first, it was just a fit to the data; only later did it turn out to enforce quantization. This second step was possible only due to a certain amount of luck (or skill, even though Planck himself called it "a fortuitous guess at an interpolation formula"). It was during the course of polishing the mathematics of his formula that Planck stumbled upon the beginnings of quantum theory. Briefly stated, he had two mathematical expressions: one, from his previous work, fitting the red (low-frequency) parts of the spectrum, and one, from Wien's law, fitting the blue (high-frequency) parts. Interpolating between them produced the radiation formula, and this is (essentially) what is compared with the experimental measurements. There are two parameters to determine from the data, written here with the symbols used today: h, the new Planck's constant, and k, Boltzmann's constant. Both have now become fundamental in physics, but that was by no means the case at the time. The "elementary quantum of energy" is hν, where ν is the frequency. But such a unit does not normally exist, and is not required for quantization.
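The radiation formula discussed above, with its two fitted constants h and k, can be written down and checked numerically. The following sketch uses the standard form of Planck's law (the exact formula is textbook physics, not quoted from this text) and confirms that it reduces to the classical Rayleigh-Jeans expression at low frequency while avoiding the classical divergence at high frequency; the temperature and frequencies are arbitrary illustrative choices.

```python
import math

# Planck's black-body law for spectral radiance B(nu, T), written with the
# two constants the text mentions: Planck's h and Boltzmann's k.
h = 6.62607015e-34   # Planck constant, J*s
k = 1.380649e-23     # Boltzmann constant, J/K
c = 2.99792458e8     # speed of light, m/s

def planck(nu, T):
    """Spectral radiance in W * sr^-1 * m^-2 * Hz^-1."""
    return (2.0 * h * nu**3 / c**2) / math.expm1(h * nu / (k * T))

def rayleigh_jeans(nu, T):
    """Classical limit, which wrongly diverges as nu grows."""
    return 2.0 * nu**2 * k * T / c**2

nu_low, nu_high, T = 1e12, 1e15, 5000.0
print(planck(nu_low, T) / rayleigh_jeans(nu_low, T))    # close to 1
print(planck(nu_high, T) / rayleigh_jeans(nu_high, T))  # far below 1
```

`math.expm1` is used instead of `exp(x) - 1` for numerical accuracy at small hν/kT, which is exactly the regime where the classical and quantum formulas agree.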
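The "elementary quantum of energy" hν mentioned above is easy to evaluate for a concrete photon. In this sketch the 500 nm wavelength (green light) is an arbitrary example, not a figure from the text.

```python
# Energy of a single "light quantum" (photon), E = h * nu.
h = 6.62607015e-34   # Planck constant, J*s
c = 2.99792458e8     # speed of light, m/s
eV = 1.602176634e-19 # one electronvolt in joules

def photon_energy(wavelength_m):
    """Return the energy in joules of one photon of the given wavelength."""
    nu = c / wavelength_m    # frequency from wavelength
    return h * nu

E_green = photon_energy(500e-9)          # ~500 nm, visible green
print(E_green, "J =", E_green / eV, "eV")
```

The result is a few times 10^-19 J, i.e. roughly 2.5 eV, which is why individual visible-light quanta are far too small to notice in everyday experience.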
Quantum mechanics - Wikipedia, the free encyclopedia

Quantum mechanics

Certain systems, however, do exhibit quantum mechanical effects on a larger scale; superfluidity (the frictionless flow of a liquid at temperatures near absolute zero) is one well-known example. Quantum theory also provides accurate descriptions for many previously unexplained phenomena, such as black-body radiation and the stability of electron orbits. It has also given insight into the workings of many different biological systems, including smell receptors and protein structures.[3] Even so, classical physics can often be a good approximation to results otherwise obtained by quantum physics, typically in circumstances with large numbers of particles or large quantum numbers. (However, some open questions remain in the field of quantum chaos.)

Overview

Quantum mechanics is essential for understanding the behavior of systems at atomic length scales and smaller. For example, if classical mechanics governed the workings of an atom, electrons would rapidly travel towards and collide with the nucleus, making stable atoms impossible. In the natural world, however, electrons normally remain in an uncertain, non-deterministic, "smeared" (wave-particle wave function) orbital path around or "through" the nucleus, defying classical electromagnetism.[7] Quantum mechanics was initially developed to provide a better explanation of the atom, especially the spectra of light emitted by different atomic species.
The quantum theory of the atom was developed as an explanation for the electron's remaining in its orbital, which could not be explained by Newton's laws of motion or by Maxwell's laws of classical electromagnetism.[8] In the formalism of quantum mechanics, the state of a system at a given time is described by a complex wave function (sometimes referred to as orbitals in the case of atomic electrons), or, more generally, by an element of a complex vector space.[9] This abstract mathematical object allows for the calculation of probabilities of outcomes of concrete experiments. For example, it allows one to compute the probability of finding an electron in a particular region around the nucleus at a particular time. Contrary to classical mechanics, one can never make simultaneous predictions of conjugate variables, such as position and momentum, with arbitrary accuracy. For instance, electrons may be considered to be located somewhere within a region of space, but with their exact positions unknown. Contours of constant probability, often referred to as "clouds", may be drawn around the nucleus of an atom to conceptualize where the electron is most likely to be located. Heisenberg's uncertainty principle quantifies the inability to precisely locate the particle given its conjugate.[10] The other exemplar that led to quantum mechanics was the study of electromagnetic waves such as light. After Max Planck found in 1900 that the energy of waves could be described as consisting of small packets or quanta, Albert Einstein exploited this idea to show that an electromagnetic wave such as light could also be described by a particle, the photon, with a discrete energy dependent on its frequency. This led to a theory of unity between subatomic particles and electromagnetic waves, called wave–particle duality, in which particles and waves were neither one nor the other, but had certain properties of both.
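The "probability of finding an electron in a particular region" mentioned above can be made concrete with the textbook hydrogen ground state, whose radial probability density (in units of the Bohr radius, a0 = 1) is p(r) = 4 r^2 e^(-2r). This is a standard result, not taken from the text; integrating it numerically gives the chance of finding the electron within one Bohr radius of the nucleus, which has the closed form 1 - 5 e^-2.

```python
import math

def radial_density(r):
    """Hydrogen 1s radial probability density, Bohr-radius units."""
    return 4.0 * r * r * math.exp(-2.0 * r)

def prob_within(r_max, n=100_000):
    """Trapezoidal integration of the radial density from 0 to r_max."""
    dr = r_max / n
    total = 0.5 * (radial_density(0.0) + radial_density(r_max))
    for i in range(1, n):
        total += radial_density(i * dr)
    return total * dr

p = prob_within(1.0)
exact = 1.0 - 5.0 * math.exp(-2.0)   # closed form for r <= 1 Bohr radius
print(p, exact)                      # both ~0.323
```

So even in its most tightly bound state, the electron is found inside one Bohr radius only about a third of the time; the rest of the "cloud" lies farther out.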
While quantum mechanics describes the world of the very small, it is also needed to explain certain "macroscopic quantum systems" such as superconductors and superfluids.[11] Broadly speaking, quantum mechanics incorporates four classes of phenomena that classical physics cannot account for: (I) the quantization (discretization) of certain physical quantities, (II) wave-particle duality, (III) the uncertainty principle, and (IV) quantum entanglement. Each of these phenomena is described in detail in subsequent sections.[11]

History

Main article: History of quantum mechanics

Planck proposed that the energy of radiation of frequency ν is absorbed and emitted in discrete quanta of size E = hν, where h is Planck's constant. Planck insisted[13] that this was simply an aspect of the processes of absorption and emission of radiation and had nothing to do with the physical reality of the radiation itself. However, this did not explain the photoelectric effect (first observed by Hertz in 1887), i.e. that shining light on certain materials can eject electrons from the material. In 1905, basing his work on Planck's quantum hypothesis, Albert Einstein[14] postulated that light itself consists of individual quanta. These later came to be called photons (1926). From Einstein's simple postulation was born a flurry of debating, theorizing and testing, and thus the entire field of quantum physics.

Quantum mechanics and classical physics

The main differences between classical and quantum theories have already been mentioned above in the remarks on the Einstein-Podolsky-Rosen paradox. Essentially the difference boils down to the statement that quantum mechanics is coherent (addition of amplitudes), whereas classical theories are incoherent (addition of intensities). Thus, such quantities as coherence lengths and coherence times come into play.
For microscopic bodies, the extension of the system is certainly much smaller than the coherence length; for macroscopic bodies, one expects the opposite.[16] This is in accordance with the following observations: Many "macroscopic" properties of "classic" systems are direct consequences of the quantum behavior of their parts. For example, the stability of bulk matter (consisting of atoms and molecules which would quickly collapse under electric forces alone), the rigidity of matter, and its mechanical, thermal, chemical, optical and magnetic properties are all results of the interaction of electric charges under the rules of quantum mechanics.[17] While the seemingly exotic behavior of matter posited by quantum mechanics and relativity theory becomes more apparent when dealing with extremely fast-moving or extremely tiny particles, the laws of classical "Newtonian" physics remain accurate in predicting the behavior of surrounding ("large") objects, of the order of the size of large molecules and bigger, at velocities much smaller than the velocity of light.[18]

Theory

In this formulation, the instantaneous state of a quantum system encodes the probabilities of its measurable properties, or "observables". Examples of observables include energy, position, momentum, and angular momentum. Observables can be either continuous (e.g., the position of a particle) or discrete (e.g., the energy of an electron bound to a hydrogen atom).[22] Generally, quantum mechanics does not assign definite values to observables. Instead, it makes predictions using probability distributions; that is, it gives the probability of obtaining each possible outcome from measuring an observable. Naturally, these probabilities depend on the quantum state at the "instant" of the measurement. Hence, uncertainty is involved in the value.
There are, however, certain states that are associated with a definite value of a particular observable. These are known as "eigenstates" of the observable ("eigen" can be roughly translated from German as inherent or characteristic[26]). In the everyday world, it is natural and intuitive to think of everything (every observable) as being in an eigenstate. Everything appears to have a definite position, a definite momentum, a definite energy, and a definite time of occurrence. However, quantum mechanics does not pinpoint the exact values of a particle's position and momentum (since they are conjugate pairs) or its energy and time (since they too are conjugate pairs); rather, it provides only a range of probabilities for where that particle might be found. Therefore, it is helpful to use different words for states having uncertain values and states having definite values (eigenstates). For example, consider a free particle. In quantum mechanics, there is wave-particle duality, so the properties of the particle can be described as the properties of a wave. Its quantum state can therefore be represented as a wave of arbitrary shape extending over space: a wave function. The position and momentum of the particle are observables. The uncertainty principle states that the position and the momentum cannot both be measured with full precision at the same time. However, one can measure the position alone of a moving free particle, creating an eigenstate of position with a wavefunction that is very sharply peaked (a Dirac delta) at a particular position x and zero everywhere else. If one performs a position measurement on such a wavefunction, the result x will be obtained with 100% probability (full certainty). This is called an eigenstate of position (mathematically more precisely: a generalized position eigenstate (eigendistribution)).
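The position-measurement story above can be sketched on a discretized grid, where each grid point plays the role of a (coarse-grained) position eigenstate: the Born rule turns the wave function's squared magnitudes into outcome probabilities, and the post-measurement state is the eigenstate of the observed value. The Gaussian profile, grid, and random seed are arbitrary illustrative choices, not taken from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

x = np.linspace(-5.0, 5.0, 201)
psi = np.exp(-x**2 / 2.0)                 # unnormalized Gaussian amplitudes
psi /= np.sqrt(np.sum(np.abs(psi)**2))    # normalize: probabilities sum to 1

probs = np.abs(psi) ** 2                  # Born rule on the grid
outcome = rng.choice(x, p=probs)          # one simulated position measurement

# After the measurement, the state is sharply peaked at the observed value
# (the grid analogue of a Dirac delta):
collapsed = np.zeros_like(psi)
collapsed[np.argmin(np.abs(x - outcome))] = 1.0

print(probs.sum())   # 1.0
print(outcome)       # most likely near 0, where |psi|^2 peaks
```

Repeating the measurement on fresh copies of `psi` would reproduce the |ψ|² histogram, which is the operational content of "a range of probabilities" rather than a definite value.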
If the particle is in an eigenstate of position then its momentum is completely unknown. On the other hand, if the particle is in an eigenstate of momentum then its position is completely unknown. [27] In an eigenstate of momentum having a plane wave form, it can be shown that the wavelength is equal to h/p, where h is Planck's constant and p is the momentum of the eigenstate.[28] Usually, a system will not be in an eigenstate of the observable we are interested in. However, if one measures the observable, the wavefunction will instantaneously be an eigenstate (or generalized eigenstate) of that observable. This process is known as wavefunction collapse, a debatable process.[29] It involves expanding the system under study to include the measurement device. If one knows the corresponding wave function at the instant before the measurement, one will be able to compute the probability of collapsing into each of the possible eigenstates. For example, the free particle in the previous example will usually have a wavefunction that is a wave packet centered around some mean position x0, neither an eigenstate of position nor of momentum. When one measures the position of the particle, it is impossible to predict with certainty the result.[30] It is probable, but not certain, that it will be near x0, where the amplitude of the wave function is large. After the measurement is performed, having obtained some result x, the wave function collapses into a position eigenstate centered at x.[31] Wave functions can change as time progresses. An equation known as the Schrödinger equation describes how wave functions change in time, a role similar to Newton's second law in classical mechanics. The Schrödinger equation, applied to the aforementioned example of the free particle, predicts that the center of a wave packet will move through space at a constant velocity, like a classical particle with no forces acting on it. 
However, the wave packet will also spread out as time progresses, which means that the position becomes more uncertain. This also has the effect of turning position eigenstates (which can be thought of as infinitely sharp wave packets) into broadened wave packets that are no longer position eigenstates.[32] Some wave functions produce probability distributions that are constant, or independent of time: in a stationary state of constant energy, for example, time drops out of the absolute square of the wave function. Many systems that are treated dynamically in classical mechanics are described by such "static" wave functions. For example, a single electron in an unexcited atom is pictured classically as a particle moving in a circular trajectory around the atomic nucleus, whereas in quantum mechanics it is described by a static, spherically symmetric wavefunction surrounding the nucleus (Fig. 1). (Note that only the lowest angular momentum states, labeled s, are spherically symmetric.)[33] The time evolution of wave functions is deterministic in the sense that, given a wavefunction at an initial time, the Schrödinger equation makes a definite prediction of what the wavefunction will be at any later time.[34] During a measurement, on the other hand, the change of the wavefunction into another one is not deterministic but unpredictable, i.e., random. The probabilistic nature of quantum mechanics thus stems from the act of measurement. This is one of the most difficult aspects of quantum systems to understand. It was the central topic in the famous Bohr-Einstein debates, in which the two scientists attempted to clarify these fundamental principles by way of thought experiments. In the decades after the formulation of quantum mechanics, the question of what constitutes a "measurement" has been extensively studied.
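The spreading of a free wave packet described above has a simple closed form for a Gaussian packet: sigma(t) = sigma0 * sqrt(1 + (hbar * t / (2 m sigma0^2))^2). The formula is standard; the electron mass and the one-angstrom initial width below are illustrative assumptions, chosen to show how fast spreading is at atomic scales.

```python
import math

hbar = 1.054571817e-34    # reduced Planck constant, J*s
m = 9.1093837015e-31      # electron mass, kg

def packet_width(sigma0, t):
    """Position-space width of a free Gaussian packet after time t."""
    spread = hbar * t / (2.0 * m * sigma0 * sigma0)
    return sigma0 * math.sqrt(1.0 + spread * spread)

sigma0 = 1e-10                          # initial width ~ one angstrom
for t in (0.0, 1e-16, 1e-15):
    print(t, packet_width(sigma0, t))   # the width only ever grows
```

For an electron confined to atomic dimensions, the width grows appreciably within about 10^-16 s, which is why position eigenstates immediately broaden into packets that are no longer eigenstates.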
Interpretations of quantum mechanics have been formulated to do away with the concept of "wavefunction collapse"; see, for example, the relative state interpretation. The basic idea is that when a quantum system interacts with a measuring apparatus, their respective wavefunctions become entangled, so that the original quantum system ceases to exist as an independent entity. For details, see the article on measurement in quantum mechanics.[35]

Mathematical formulation

The time evolution of a quantum state is described by the Schrödinger equation, in which the Hamiltonian, the operator corresponding to the total energy of the system, generates time evolution. The inner product between two state vectors is a complex number known as a probability amplitude. During a measurement, the probability that a system collapses from a given initial state to a particular eigenstate is given by the square of the absolute value of the probability amplitude between the initial and final states. The possible results of a measurement are the eigenvalues of the operator, which explains the choice of Hermitian operators, for which all the eigenvalues are real. The probability distribution of an observable in a given state can be found by computing the spectral decomposition of the corresponding operator. Heisenberg's uncertainty principle is represented by the statement that the operators corresponding to certain observables do not commute. The Schrödinger equation acts on the entire probability amplitude, not merely its absolute value. Whereas the absolute value of the probability amplitude encodes information about probabilities, its phase encodes information about the interference between quantum states. This gives rise to the wave-like behavior of quantum states.
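The measurement rule just stated (eigenvalues of a Hermitian operator as possible results, squared amplitudes as probabilities) can be seen directly in a finite-dimensional sketch. The 2x2 observable and the state below are arbitrary illustrative choices.

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])            # a Hermitian observable (Pauli-X)
psi = np.array([1.0, 0.0])            # state |0>, already normalized

eigvals, eigvecs = np.linalg.eigh(A)  # real eigenvalues, orthonormal columns
amps = eigvecs.conj().T @ psi         # probability amplitudes <e_i | psi>
probs = np.abs(amps) ** 2             # Born rule

print(eigvals)   # the possible measurement results: -1 and +1
print(probs)     # |0> is an equal superposition of the two eigenstates
```

Because A is Hermitian, `eigh` guarantees real eigenvalues and an orthonormal eigenbasis, so the probabilities always sum to one; this is the finite-dimensional shadow of the spectral decomposition mentioned above.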
It turns out that analytic solutions of Schrödinger's equation are available for only a small number of model Hamiltonians, of which the quantum harmonic oscillator, the particle in a box, the hydrogen molecular ion and the hydrogen atom are the most important representatives. Even the helium atom, which contains just one more electron than hydrogen, defies all attempts at a fully analytic treatment. There exist several techniques for generating approximate solutions. In the method known as perturbation theory, for instance, one uses the analytic results for a simple quantum mechanical model to generate results for a more complicated model related to the simple one by, for example, the addition of a weak potential energy. Another method is the "semi-classical equation of motion" approach, which applies to systems for which quantum mechanics produces only weak deviations from classical behavior; the deviations can then be calculated from the classical motion. This approach is important for the field of quantum chaos. An alternative formulation of quantum mechanics is Feynman's path integral formulation, in which a quantum-mechanical amplitude is considered as a sum over histories between initial and final states; this is the quantum-mechanical counterpart of action principles in classical mechanics.

Interactions with other scientific theories

Unsolved problems in physics: In the correspondence limit of quantum mechanics, is there a preferred interpretation of quantum mechanics? How does the quantum description of reality, which includes elements such as the "superposition of states" and "wavefunction collapse", give rise to the reality we perceive?

When quantum mechanics was originally formulated, it was applied to models whose correspondence limit was non-relativistic classical mechanics.
For instance, the well-known model of the quantum harmonic oscillator uses an explicitly non-relativistic expression for the kinetic energy of the oscillator, and is thus a quantum version of the classical harmonic oscillator. Early attempts to merge quantum mechanics with special relativity involved the replacement of the Schrödinger equation with a covariant equation such as the Klein-Gordon equation or the Dirac equation. While these theories were successful in explaining many experimental results, they had certain unsatisfactory qualities stemming from their neglect of the relativistic creation and annihilation of particles. A fully relativistic quantum theory required the development of quantum field theory, which applies quantization to a field rather than a fixed set of particles. The first complete quantum field theory, quantum electrodynamics, provides a fully quantum description of the electromagnetic interaction. The full apparatus of quantum field theory is often unnecessary for describing electrodynamic systems. A simpler approach, one employed since the inception of quantum mechanics, is to treat charged particles as quantum mechanical objects being acted on by a classical electromagnetic field. For example, the elementary quantum model of the hydrogen atom describes the electric field of the hydrogen atom using a classical Coulomb potential. This "semi-classical" approach fails if quantum fluctuations in the electromagnetic field play an important role, such as in the emission of photons by charged particles. Quantum field theories for the strong nuclear force and the weak nuclear force have been developed. The quantum field theory of the strong nuclear force is called quantum chromodynamics, and describes the interactions of the subnuclear particles: quarks and gluons. 
The weak nuclear force and the electromagnetic force were unified, in their quantized forms, into a single quantum field theory known as electroweak theory, by the physicists Abdus Salam, Sheldon Glashow and Steven Weinberg. It has proven difficult to construct quantum models of gravity, the remaining fundamental force. Semi-classical approximations are workable, and have led to predictions such as Hawking radiation. However, the formulation of a complete theory of quantum gravity is hindered by apparent incompatibilities between general relativity, the most accurate theory of gravity currently known, and some of the fundamental assumptions of quantum theory. The resolution of these incompatibilities is an area of active research, and theories such as string theory are among the possible candidates for a future theory of quantum gravity.

Example

For a particle in a one-dimensional box of width L, the general solutions of the time-independent Schrödinger equation inside the box are ψ(x) = C sin(kx) + D cos(kx), or equivalently a sum of complex exponentials (by Euler's formula). The wavefunction must vanish at the walls. Consider x = 0: since sin 0 = 0 and cos 0 = 1, the cosine term has to be removed to satisfy ψ(0) = 0; hence D = 0. At x = L, the condition ψ(L) = C sin(kL) = 0 requires kL = nπ. In this situation, n must be an integer, showing the quantization of the energy levels: E_n = n²π²ħ²/(2mL²) = n²h²/(8mL²).

Attempts at a unified field theory

Relativity and quantum mechanics

Even with the defining postulates of both Einstein's theory of general relativity and quantum theory being indisputably supported by rigorous and repeated empirical evidence, and even though they do not directly contradict each other theoretically (at least with regard to their primary claims), they are resistant to being incorporated within one cohesive model.[41] Einstein himself is well known for rejecting some of the claims of quantum mechanics. While clearly contributing to the field, he did not accept the more philosophical consequences and interpretations of quantum mechanics, such as the lack of deterministic causality and the assertion that a single subatomic particle can occupy numerous areas of space at one time.
He also was the first to notice some of the apparently exotic consequences of entanglement and used them to formulate the Einstein-Podolsky-Rosen paradox, in the hope of showing that quantum mechanics had unacceptable implications. This was in 1935; but in 1964 John Bell showed (see Bell inequality) that Einstein's assumption that quantum mechanics was correct but had to be completed by hidden variables was based on wrong philosophical assumptions. According to the paper of J. Bell and the Copenhagen interpretation (the common interpretation of quantum mechanics by physicists for decades), and contrary to Einstein's ideas, quantum mechanics is neither a "realistic" theory (since quantum measurements do not read off pre-existing properties, but rather prepare them) nor a local one. Gravity is negligible in many areas of particle physics, so that unification between general relativity and quantum mechanics is not an urgent issue in those applications. However, the lack of a correct theory of quantum gravity is an important issue in cosmology, and physicists search for an elegant "Theory of Everything". Thus, resolving the inconsistencies between the two theories has been a major goal of twentieth- and twenty-first-century physics. Many prominent physicists, including Stephen Hawking, have labored in the attempt to discover a theory underlying everything, combining not only different models of subatomic physics but also deriving the universe's four forces (the strong force, electromagnetism, the weak force, and gravity) from a single force or phenomenon. One of the leading minds in this field is Edward Witten, a theoretical physicist who formulated the groundbreaking M-theory, an attempt at describing supersymmetry-based string theory.

Applications

Quantum mechanics is important for understanding how individual atoms combine covalently to form chemicals or molecules. The application of quantum mechanics to chemistry is known as quantum chemistry.
(Relativistic) quantum mechanics can in principle mathematically describe most of chemistry. Quantum mechanics can provide quantitative insight into ionic and covalent bonding processes by explicitly showing which molecules are energetically favorable to which others, and by approximately how much.[42] Most of the calculations performed in computational chemistry rely on quantum mechanics.[43] Much of modern technology operates at a scale where quantum effects are significant. Examples include the laser, the transistor, the electron microscope, and magnetic resonance imaging. The study of semiconductors led to the invention of the diode and the transistor, which are indispensable for modern electronics. Researchers are currently seeking robust methods of directly manipulating quantum states. Efforts are being made to develop quantum cryptography, which would allow guaranteed secure transmission of information. A more distant goal is the development of quantum computers, which are expected to perform certain computational tasks exponentially faster than classical computers. Another active research topic is quantum teleportation, which deals with techniques to transmit quantum states over arbitrary distances. In many devices, even the simple light switch, quantum tunneling is vital, as otherwise the electrons in the electric current could not penetrate the potential barrier made up, in the case of the light switch, of a layer of oxide. Flash memory chips found in USB drives also use quantum tunneling to erase their memory cells.

Philosophical consequences

The Copenhagen interpretation, due largely to the Danish theoretical physicist Niels Bohr, is the interpretation of quantum mechanics most widely accepted amongst physicists. According to it, the probabilistic nature of quantum mechanical predictions cannot be explained in terms of some other deterministic theory, and does not simply reflect our limited knowledge.
Quantum mechanics provides probabilistic results because the physical universe is itself probabilistic rather than deterministic. Albert Einstein, himself one of the founders of quantum theory, disliked this loss of determinism in measurement (this dislike is the source of his famous quote, "God does not play dice with the universe."). Einstein held that there should be a local hidden variable theory underlying quantum mechanics and that, consequently, the present theory was incomplete. He produced a series of objections to the theory, the most famous of which has become known as the EPR paradox. John Bell showed that the EPR paradox led to experimentally testable differences between quantum mechanics and local realistic theories. Experiments have been performed confirming the accuracy of quantum mechanics, thus demonstrating that the physical world cannot be described by local realistic theories.[44] The Bohr-Einstein debates provide a vibrant critique of the Copenhagen Interpretation from an epistemological point of view. The Everett many-worlds interpretation, formulated in 1956, holds that all the possibilities described by quantum theory simultaneously occur in a "multiverse" composed of mostly independent parallel universes.[45] This is not accomplished by introducing some new axiom to quantum mechanics, but on the contrary by removing the axiom of the collapse of the wave packet: All the possible consistent states of the measured system and the measuring apparatus (including the observer) are present in a real physical (not just formally mathematical, as in other interpretations) quantum superposition. (Such a superposition of consistent state combinations of different systems is called an entangled state.) While the multiverse is deterministic, we perceive non-deterministic behavior governed by probabilities, because we can observe only the universe, i.e. the consistent state contribution to the mentioned superposition, we inhabit. 
Everett's interpretation is perfectly consistent with John Bell's experiments and makes them intuitively understandable. However, according to the theory of quantum decoherence, the parallel universes will never be accessible to us. This inaccessibility can be understood as follows: once a measurement is done, the measured system becomes entangled with both the physicist who measured it and a huge number of other particles, some of which are photons flying away towards the other end of the universe. In order to prove that the wave function did not collapse, one would have to bring all these particles back and measure them again, together with the system that was measured originally. This is completely impractical, but even if one could theoretically do this, it would destroy any evidence that the original measurement took place (including the physicist's memory).
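The Bell experiments referred to above can be made concrete with the standard CHSH calculation: for a singlet pair, quantum mechanics predicts the spin correlation E(a, b) = -cos(a - b) between analyzer angles a and b, and the textbook angle choices below give |S| = 2√2, exceeding the bound of 2 that any local realistic theory must satisfy. This is the generic textbook computation, not a quote from the text.

```python
import math

def E(a, b):
    """Quantum-mechanical singlet correlation for analyzer angles a, b."""
    return -math.cos(a - b)

a1, a2 = 0.0, math.pi / 2                 # Alice's two settings
b1, b2 = math.pi / 4, 3 * math.pi / 4     # Bob's two settings

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S), 2 * math.sqrt(2))           # both ~2.828, above the classical 2
```

Any assignment of pre-existing ±1 outcomes to all four settings keeps |S| ≤ 2, so the measured value of about 2.83 is what rules out local realism.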
Quantum mind - Wikipedia, the free encyclopedia

Quantum mind

Quantum mind theories are based on the premise that quantum mechanics is necessary to fully understand the mind and brain.

Introduction

Supporters of the quantum mind hypothesis have not submitted any evidence for its claims to peer review, but the hypothesis has also not been falsified. As such, the hypothesis is still in its early phases.

Motivation

Consciousness Banished

Fritjof Capra writes: To make it possible for scientists to describe nature mathematically, Galileo postulated that they should restrict themselves to studying the essential properties of material bodies—shapes, numbers, and movement—which could be measured and quantified. Other properties, like color, sound, taste, or smell, were merely subjective mental projections which should be excluded from the domain of science. [1]

Proponents of the quantum mind state that perceived qualities such as sound, taste and smell are an essential part of the human experience and therefore cannot be discounted. They posit that classical mechanics fails to account for the experience of such phenomena. Similarly, they hypothesize that internal experiences of consciousness, such as dreaming and memory, all of which are 'part and parcel' of everyday human experience, remain unaccounted for.

Minimization of Mystery

Examples of theories

David Bohm

Bohm's implicate order applies both to matter and consciousness, and he proposed that it could explain the relationship between them. Mind and matter are here seen as projections into our explicate order from the underlying reality of the implicate order. Bohm claims that when we look at matter in space, we can see nothing in these concepts that helps us to understand consciousness.
In Bohm's scheme there is a fundamental level where consciousness is not distinct from matter. Bohm's view of consciousness is connected to Karl Pribram's holographic conception of the brain [4][5]. Pribram regards sight and the other senses as lenses, without which sensory input would appear as a hologram. Pribram proposes that information is recorded all over the brain, and that it is enfolded into a whole, similar to a hologram. It is suggested that memories are connected by association and manipulated by logical thought. If the brain is also receiving sensory input, all these are proposed to unite in overall experience or consciousness. In trying to describe the nature of consciousness, Bohm discusses the experience of listening to music. He thinks that the feeling of movement and change that makes up our experience of music derives from both the immediate past and the present being held in the brain together, with the notes from the past seen as transformations rather than memories. The notes that were implicate in the immediate past are seen as becoming explicate in the present. Bohm compares this to consciousness emerging from the implicate order. Bohm sees the movement, change or flow, and also the coherence of experiences such as listening to music, as a manifestation of the implicate order. He claims to derive evidence for this from the work of Piaget[6] in studying infants. He claims that these studies show that young children have to learn about time and space, because they are part of the explicate order, but have a 'hard-wired' understanding of movement because it is part of the implicate order. He compares this 'hard-wiring' to Chomsky's theory that grammar is 'hard-wired' into young human brains. In his writings, Bohm never proposed any specific brain mechanism by which his implicate order could emerge in a way that was relevant to consciousness.
[edit] Gustav Bernroider Bernroider bases his work on recent studies of the potassium (K+) ion channel in its closed state and draws particularly on the atomic-level spectroscopy work of the MacKinnon group [9][10][11][12][13]. The ion channels have a filter region which admits K+ ions and bars other ions. These studies show that the filter region has a framework of five sets of four oxygen atoms, which are part of the carboxyl groups of amino-acid molecules in the surrounding protein. These are referred to as binding pockets. Two K+ ions are trapped in the selection filter of the closed ion channel. Each of these ions is electrostatically bound to two sets of oxygen atoms or binding pockets, involving eight oxygen atoms in total. Both ions in the channel oscillate between two configurations. Bernroider uses this recently revealed structure to speculate about the possibility of quantum coherence in the ion channels. Bernroider and co-author Sisir Roy's calculations suggested to them that the behaviour of the ions in the K channel could only be understood at the quantum level. Taking this as their starting point, they then ask whether the structure of the ion channel can be related to logic states. Further calculations lead them to suggest that the K+ ions and the oxygen atoms of the binding pockets are two quantum-entangled sub-systems, which they then equate to a quantum computational mapping. The ions that are destined to be expelled from the channel are proposed to encode information about the state of the oxygen atoms. It is further proposed that separate ion channels could be quantum entangled with one another. [edit] David Chalmers One possibility is that instead of postulating novel properties, physics might end up appealing to consciousness itself, in the way that some theorists, but not all, hold that quantum mechanics does. [14] The collapse dynamics leaves a door wide open for an interactionist interpretation.
[15] The most promising version of such an interpretation allows conscious states to be correlated with the total quantum state of a system, with the extra constraint that conscious states (unlike physical states) can never be superposed. In a conscious physical system such as a brain, the physical and phenomenal states of the system will be correlated in a (nonsuperposed) quantum state. Upon observation of a superposed external system, Schrödinger evolution at the moment of observation would cause the observed system to become correlated with the brain, yielding a resulting superposition of brain states and so (by psychophysical correlation) a superposition of conscious states. But such a superposition cannot occur, so one of the potential resulting conscious states is somehow selected (presumably by a nondeterministic dynamic principle at the phenomenal level). The result is that (by psychophysical correlation) a definite brain state and a definite state of the observed object are also selected. [16] If physics is supposed to rule out interactionism, then careful attention to the detail of physical theory is required. [17] [edit] Roger Penrose 1. Humans have abilities, particularly mathematical ones, that no algorithmic computer (specifically Turing machine) could have, because computers are limited by Gödel's incompleteness theorem. In other words, he believes humans are hypercomputers. (The argument was originally due to John Lucas.) Gödel demonstrated that with any recursively enumerable set of axioms capable of expressing Peano arithmetic, it was possible to produce a statement that was obviously true, but could not be proved by the axioms. The theorem enjoys general acceptance in the mathematical community[18]. Penrose, however, built a further and highly controversial argument on this theorem. 
He argued that the theorem showed that the brain had the ability to go beyond what can be demonstrated by mathematical axioms, and therefore there is something within the functioning of the brain that is not based on an algorithm (a system of calculations). A computer is just a system of algorithms, and Penrose claimed that Gödel's theorem demonstrated that brains could perform functions that no computer could perform. Penrose is not interested in explaining phenomenal consciousness (qualia), generally regarded as the most mysterious feature of consciousness, but instead focuses mainly on the cognitive powers of mathematicians. These assertions have been vigorously contested by many critics, notably the philosophers Churchland and Grush[19][20]. The theory has been much criticised[21][22][23]. 2. This would require some new physics. Penrose postulates that the currently unknown process underlying quantum collapse supplies the non-algorithmic element. The random choice involved in the collapse of the wave function, for instance of the position of a particle, was the only physical process Penrose could find that was not based on an algorithm. However, randomness was not a promising basis for the quality of mathematical judgement highlighted by his Gödel theorem argument. Penrose therefore went on to propose that when the wave function did not collapse as a result of a measurement or of decoherence in the environment, there could be an alternative form of wave function collapse, which he called objective reduction (OR). In this, each quantum superposition has its own spacetime geometry. When these become separated by more than the Planck length, they are affected by gravity, become unstable and collapse.
OR is strikingly different both from the traditional orthodoxy of Niels Bohr's Copenhagen interpretation of quantum theory and from some more modern theories which avoid wave function collapse altogether, such as the many-worlds interpretation or some forms of quantum decoherence theory. Penrose further proposes that OR is neither random nor governed by an algorithm, but is 'non-computational', selecting information embedded in the fundamental level of spacetime geometry. 3. Collapse requires a coherent superposed state to work on. Penrose borrows Stuart Hameroff's proposal about microtubules to supply this. Initially, Penrose lacked any detailed proposal for how OR could occur in the brain. Later, cooperation with Stuart Hameroff [24] supplied this side of the theory. Microtubules were central to Hameroff's proposals. These are the core element of the cytoskeleton, which provides a supportive structure and performs various functions in body cells. In addition to these functions, it was now proposed that the microtubules could support macroscopic quantum features known as Bose-Einstein condensates. It was also suggested that these condensates could link with other neurons via gap junctions. This is claimed to permit quantum coherence to extend over a large area of the brain. It is suggested that when one of these areas of quantum coherence collapses, there is an instance of consciousness, and the brain has access to a non-computational process embedded in the fundamental level of spacetime geometry. At the same time, it was postulated that conventional synaptic activity influences and is influenced by the activity in the microtubules. This part of the process is referred to as 'orchestration', hence the theory is called Orchestrated Objective Reduction, or more commonly Orch OR. Hameroff's proposals, like those of Penrose, attracted much criticism.
However, the most cogent attack on Orch OR and quantum mind theories in general was the view that conditions in the brain would cause any quantum coherence to decohere too quickly for it to be relevant to neural processes. This general criticism is discussed in the Science section below. [edit] Evan Harris Walker Information theory is concerned with the measurement of information in terms of logarithmic probability—how many bits of information it takes to represent a certain piece of information such as, say, the letter "T" in print. Since we don't know all the possible permutations or "combinations" of such a question, we use statistical probability in order to be very accurate in our measurements. We add up the logarithmic contributions of each possible symbol being measured, weighted by its chance of occurrence. It is expressed as log₂P. This gives us an informational field potential. The physicist Evan Harris Walker developed a scientific theory about how the brain might, at quantum levels, process information. In his book, The Physics of Consciousness, he adds log₂P to Schrödinger's equation. What he demonstrates mathematically is that when information is measured by consciousness and will channel capacities in a closed loop, it forces one real solution: one probable state happens and all other possible states disappear. He proposes physical evidence that this process is occurring in the brain. [edit] Henry Stapp Stapp envisages consciousness as exercising top-level control over neural excitation in the brain. Quantum brain events are suggested to occur at the whole-brain level, and are seen as being selected from the large-scale excitation of the brain. The neural excitations are viewed as a code, and each conscious experience as a selection from this code.
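The log₂P quantity described above is the standard Shannon self-information measure. As a minimal sketch of how it behaves (the letter probabilities below are illustrative assumptions, not Walker's actual figures):

```python
import math

def self_information(p):
    """Bits needed to encode an event of probability p: -log2(p)."""
    return -math.log2(p)

def average_information(probs):
    """Expected self-information over a distribution (Shannon entropy)."""
    return sum(p * self_information(p) for p in probs if p > 0)

# Hypothetical print-symbol probabilities, for illustration only.
for symbol, p in {"E": 0.12, "T": 0.09, "Q": 0.001}.items():
    print(f"{symbol}: {self_information(p):.2f} bits")

# A uniform 4-symbol alphabet costs exactly 2 bits per symbol.
print(average_information([0.25] * 4))  # 2.0
```

The rarer the symbol, the more bits it carries; summing each symbol's logarithmic contribution weighted by its chance of occurrence gives the average information content the passage refers to.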
The brain, in this theory, is proposed to be a self-programming computer with a self-sustaining input from memory, which is itself a code derived from previous experience. This process results in a number of probabilities from which consciousness has to select. The conscious act is a selection of a piece of top-level code, which then exercises ongoing control over the flow of neural excitation. This process refers to the top levels of brain activity involved with information gathering, planning and the monitoring of the execution of plans. Conscious events are proposed to be capable of grasping a whole pattern of activity, thus accounting for the unity of consciousness, and providing a solution to the 'binding problem'. Stapp's version of the conscious brain is proposed to be a system that is internally determined in a way that cannot be represented outside the system, whereas for the rest of the physical universe an external representation plus a knowledge of the laws of physics allows an accurate prediction of future events. Stapp proposes that the proof of his theory requires the identification of the neurons that provide the top-level code and also the process by which memory is turned into additional top-level code. [edit] Quantum Brain Dynamics Fröhlich is the source of the idea that quantum coherent waves could be generated in the neuronal network. Fröhlich argued that it was not clear how order could be sustained in living systems given the disruptive influence of the fluctuations in biochemical processes. He viewed the electric potential across the neuron membrane as the observable feature of some form of underlying quantum order. His studies claimed to show that with an oscillating charge in a thermal bath, large numbers of quanta may condense into a single state known as a Bose condensate. This state allows long-range correlation amongst the dipoles involved.
Further to this, biomolecules were proposed to line up along actin filaments (part of the cytoskeleton), with dipole oscillations propagating along the filaments as quantum coherent waves. This now has some experimental support: biomolecules with a high electric dipole moment have been shown to oscillate periodically[31]. Vitiello also argues that the ordered chains of chemical reactions on which biological tissues depend would collapse without some form of quantum ordering, which in QBD is described by quantum field theory rather than quantum mechanics. Vitiello provides citations which are claimed to support his view of biological tissue. These include studies of radiation effects on cell growth[32], response to external stimuli[33], non-linear tunnelling[34], coherent nuclear motion in membrane proteins[35], optical coherence in biological systems[36], and energy transfer via solitons and coherent excitations[37]. QBD proposes that the cortical field not only interacts with, but also to a good extent controls, the neuronal network. It suggests that biomolecular waves propagate along the actin filaments in the area of the cell membranes and dendritic spines. The waves derive energy from ATP molecules stored in the cell membrane and control the ion channels, which in turn regulate the flow of signals to the synapses. Vitiello claims that QBD does not require quantum oscillations to last as long as the actual time to decoherence. The proponents of QBD differ somewhat as to the exact way in which it produces consciousness. Jibu and Yasue think that the interaction between the energy quanta of the cortical field and the biomolecular waves of the neuronal network, particularly the dendritic part of the network, is what produces consciousness. On the other hand, Vitiello thinks that the quantum states involved in QBD produce two poles, a subjective representation of the external world and a self.
This self opens itself to the representation of the external world. Consciousness is, in this theory, not in either the self or the external representation, but between the two, in the opening of one to the other. [edit] Quantum Evidence Physicists at the University of California, Berkeley believe they have discovered that green plants perform quantum computation in order to capture the sun's light through photosynthesis—evidence of quantum coherence in a living system.[38] Stuart Hameroff noted, in October 2000, that quantum coherence—although, by its mere occurrence in the brain, not sufficient to prove its supposed central role in consciousness—had nevertheless been observed. This, he claimed, was significant because his model had "come under sharp criticism due to the issue of decoherence, and the question of whether quantum processes of significance can exist in the brain at physiological temperature." (Quantum Mind archives, October 2000 - (11.)) [edit] Ongoing Debate [edit] Science In quantum terms each neuron is an essentially classical object. Consequently quantum noise in the brain is at such a low level that it probably doesn't often alter, except very rarely, the critical mechanistic behaviour of sufficient neurons to cause a decision to be different than we might otherwise expect. (...) —Michael Clive Price[1] One well-known critic of the quantum mind is Max Tegmark. Based on his calculations, Tegmark concluded that quantum systems in the brain decohere quickly and cannot control brain function: "This conclusion disagrees with suggestions by Penrose and others that the brain acts as a quantum computer, and that quantum coherence is related to consciousness in a fundamental way."[39] Proponents of quantum consciousness theories have sought to defend them against Tegmark's criticism.
In respect of QBD, Vitiello has argued that Tegmark's work applies to theories based on quantum mechanics but not to those, such as QBD, that are based on quantum field theory. In respect of Penrose and Hameroff's Orch OR theory, Hameroff, along with Hagan and Tuszynski, replied to Tegmark[40]. They claimed that Tegmark based his calculations on a model that was different from Orch OR. It is argued that in the Orch OR model the microtubules are shielded from decoherence by ordered water. Energy pumping resulting from thermal disequilibrium, Debye-layer screening, and quantum error correction deriving from the geometry of the microtubule lattice are also proposed as possible sources of shielding. Similarly, in his extension of Bohm's ideas, Bernroider has claimed that the binding pockets in the ion selection filters could protect against decoherence[41]. So far, however, there has been no experimental confirmation that the features mentioned above can protect against decoherence. [edit] Philosophy As David Chalmers puts it: Nevertheless, quantum theories of consciousness suffer from the same difficulties as neural or computational theories. Quantum phenomena have some remarkable functional properties, such as nondeterminism and nonlocality. It is natural to speculate that these properties may play some role in the explanation of cognitive functions, such as random choice and the integration of information, and this hypothesis cannot be ruled out a priori. But when it comes to the explanation of experience, quantum processes are in the same boat as any other. The question of why these processes should give rise to experience is entirely unanswered. [2] Other philosophers, such as Patricia and Paul Churchland and Daniel Dennett[43], reject the idea that there is anything puzzling about consciousness in the first place.
THE ATOM THE
THE ATUM THE
THE HERMETICA THE LOST WISDOM OF THE PHARAOHS Timothy Freke & Peter Gandy To the Memory of Giordano Bruno 1548 - 1600 Mundus Nihil Pulcherrimum The World is a Beautiful Nothing Page 23 "Although we have used the familiar term 'God' in the explanatory notes which accompany each chapter, we have avoided this term in the text itself. Instead we have used 'Atum' - one of the ancient Egyptian names for the Supreme One God."
Page 45 The Being of Atum "Atum is Primal Mind."
Page 45 The Being of Atum Give me your whole awareness, and concentrate your thoughts, for Knowledge of Atum's Being requires deep insight, which comes only as a gift of grace. It is like a plunging torrent of water whose swiftness outstrips any man who strives to follow it, leaving behind not only the hearer, but even the teacher himself. To conceive of Atum is difficult. To define him is impossible. The imperfect and impermanent cannot easily apprehend the eternally perfected. Atum is whole and constant. In himself he is motionless, yet he is self-moving. He is immaculate, incorruptible and ever-lasting. He is the Supreme Absolute Reality. He is filled with ideas which are imperceptible to the senses, and with all-embracing Knowledge. Atum is Primal Mind. Page 46 He is too great to be called by the name 'Atum'. He is hidden, yet obvious everywhere. His Being is known through thought alone, yet we see his form before our eyes. He is bodiless, yet embodied in everything. There is nothing which he is not. He has no name, because all names are his name. He is the unity in all things, so we must know him by all names and call everything 'Atum'. He is the root and source of all. Everything has a source, except this source itself, which springs from nothing. Atum is complete like the number one, which remains itself whether multiplied or divided, and yet generates all numbers. Atum is the Whole which contains everything. He is One, not two. He is All, not many. The All is not many separate things, but the Oneness that subsumes the parts. The All and the One are identical. You think that things are many when you view them as separate, but when you see they all hang on the One, /Page 47/ and flow from the One, you will realise they are united, linked together, and connected by a chain of Being from the highest to the lowest, all subject to the will of Atum. The Cosmos is one as the sun is one, the moon is one and the Earth is one. Do you think there are many Gods?
That's absurd - God is one. Atum alone is the Creator of all that is immortal, and all that is mutable. If that seems incredible, just consider yourself. You see, speak, hear, touch, taste, walk, think and breathe. It is not a different you who does these various things, but one being who does them all. To understand how Atum makes all things, consider a farmer sowing seeds;
here wheat - there barley. Just as the same man plants all these seeds, so Atum sows immortality in heaven and change on Earth. Throughout the Cosmos he disseminates Life and movement - the two great elements that comprise Atum and his creation, and so everything that is. Page 48 Atum is called 'Father' because he begets all things, and, from his example, the wise hold begetting children the most sacred pursuit of human life. Atum works with Nature, within the laws of Necessity, causing extinction and renewal, constantly creating creation to display his wisdom. Yet, the things that the eye can see are mere phantoms and illusions. Only those things invisible to the eye are real. Above all are the ideas of Beauty and Goodness. Just as the eye cannot see the Being of Atum, so it cannot see these great ideas. They are attributes of Atum alone, and are inseparable from him. They are so perfectly without blemish that Atum himself is in love with them. There is nothing which Atum lacks, so nothing that he desires. There is nothing that Atum can lose, so nothing can cause him grief. Atum is everything. Atum makes everything, and everything is a part of Atum. Atum, therefore, makes himself. This is Atum's glory - he is all-creative, and this creating is his very Being. It is impossible for him ever to stop creating - for Atum can never cease to be. Page 49 Atum is everywhere. Mind cannot be enclosed, because everything exists within Mind. Nothing is so quick and powerful. Just look at your own experience. Imagine yourself in any foreign land, and quick as your intention you will be there! Think of the ocean - and there you are. You have not moved as things move, but you have travelled, nevertheless. Fly up into the heavens - you won't need wings! Nothing can obstruct you - not the burning heat of the sun, or the swirling planets. Pass on to the limits of creation. Do you want to break out beyond the boundaries of the Cosmos? For your mind, even that is possible.
Can you sense what power you possess? If you can do all this, then what about your Creator? Try and understand that Atum is Mind. This is how he contains the Cosmos. All things are thoughts which the Creator thinks."
Kaleidoscope - Wikipedia, the free encyclopedia [edit] Etymology More informatively: "looking at beautiful forms" [edit] History In America, Charles Bush popularized the kaleidoscope. Today, these early products often sell for over $1,000. Cozy Baker collected kaleidoscopes and wrote books about a few of the artists who were making them in the 1970s through 2000. Baker is credited with energizing a renaissance in kaleidoscope-making in America. In 1999 a short-lived magazine dedicated to kaleidoscopes, called Kaleidoscope Review, was published, covering artists, collectors, dealers, events, and how-to articles. This magazine was created and edited by Brett Bensley, at that time a well-known kaleidoscope artist and resource on kaleidoscope information. Craft galleries often carry a few kaleidoscopes, while others specialize in them and carry dozens of different types from different artists and craftspeople. Kaleidoscopes are related to hyperbolic geometry.
MIN DOTH DREAM WHAT DOTH MIN MEAN